A Supplementary Material
Figure A.1: The median difference in GP log score between the forward and backward model […].

Figure A.3 shows the distribution of […]. Cyclic graphs occasionally returned by DiBS+ were discarded.

We performed an additional experiment comparing the ability of the different methods to model the posterior distribution over DAGs as a function of their run-time. Figure A.4 shows the reverse K-L divergence between the "true" posterior (obtained by enumerating every possible structure) […].

Figure A.4: Reverse K-L divergence between the true posterior and the BGe posterior (green), DiBS+ […].

In Figure A.5 we compare the number of score evaluations performed by the different methods.

Figure A.5: Distribution of the number of scores evaluated by the different methods.

Figure A.9 shows the corresponding run-times needed to run […].
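The reverse K-L comparison described above can be sketched in a few lines. This is an illustrative snippet, not the authors' code; it assumes "reverse" means KL(approximation ‖ true posterior), computed over a common enumeration of structures:

```python
import numpy as np

def reverse_kl(true_post, approx_post, eps=1e-12):
    """Reverse K-L divergence KL(q || p), where p is the (enumerated)
    true posterior over structures and q the approximate posterior.
    Both inputs are unnormalised weights over the same enumeration."""
    p = np.asarray(true_post, dtype=float)
    q = np.asarray(approx_post, dtype=float)
    p = p / p.sum()
    q = q / q.sum()
    mask = q > 0  # terms with q = 0 contribute nothing to the sum
    return float(np.sum(q[mask] * np.log(q[mask] / (p[mask] + eps))))
```

For identical distributions the divergence is zero; any mismatch between the approximation and the true posterior yields a positive value.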
A Bayesian Take on Gaussian Process Networks
Enrico Giudice, Jack Kuipers, Giusi Moffa
Gaussian Process Networks (GPNs) are a class of directed graphical models which employ Gaussian processes as priors for the conditional expectation of each variable given its parents in the network. The model allows the description of continuous joint distributions in a compact but flexible manner with minimal parametric assumptions on the dependencies between variables. Bayesian structure learning of GPNs requires computing the posterior over graphs of the network and is computationally infeasible even in low dimensions. This work implements Monte Carlo and Markov Chain Monte Carlo methods to sample from the posterior distribution of network structures. As such, the approach follows the Bayesian paradigm, comparing models via their marginal likelihood and computing the posterior probability of the GPN features. Simulation studies show that our method outperforms state-of-the-art algorithms in recovering the graphical structure of the network and provides an accurate approximation of its posterior distribution.
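To illustrate the structure-sampling idea in the abstract, here is a minimal, hypothetical sketch of Metropolis-Hastings over DAG adjacency matrices with single-edge moves. It is not the authors' implementation: it omits the Hastings correction for unequal neighbourhood sizes, and the user-supplied `log_score` stands in for the GP marginal likelihood plus graph prior:

```python
import numpy as np

def is_acyclic(adj):
    """Check acyclicity by repeatedly stripping nodes with no parents."""
    active = list(range(adj.shape[0]))
    while active:
        roots = [v for v in active if not adj[active, v].any()]
        if not roots:
            return False  # every remaining node has a parent: a cycle exists
        active = [v for v in active if v not in roots]
    return True

def random_single_edge_neighbor(adj, rng):
    """Propose a DAG one edge away: add, delete or reverse a single edge.
    Returns None if the proposal contains a cycle."""
    n = adj.shape[0]
    new = adj.copy()
    i, j = rng.choice(n, size=2, replace=False)
    if new[i, j]:
        if rng.random() < 0.5:
            new[i, j] = 0                 # delete edge i -> j
        else:
            new[i, j], new[j, i] = 0, 1   # reverse to j -> i
    else:
        new[i, j] = 1                     # add edge i -> j
    return new if is_acyclic(new) else None

def structure_mcmc(log_score, n_nodes, n_steps, seed=0):
    """Simplified Metropolis-Hastings over DAG structures.
    `log_score` maps an adjacency matrix to an unnormalised log posterior
    (in a GPN this would be the GP marginal likelihood plus a graph prior).
    NOTE: a correct sampler also needs the Hastings ratio of proposal
    probabilities, omitted here for brevity."""
    rng = np.random.default_rng(seed)
    adj = np.zeros((n_nodes, n_nodes), dtype=int)
    cur = log_score(adj)
    samples = []
    for _ in range(n_steps):
        prop = random_single_edge_neighbor(adj, rng)
        if prop is not None:
            new = log_score(prop)
            if np.log(rng.random()) < new - cur:  # accept/reject step
                adj, cur = prop, new
        samples.append(adj.copy())
    return samples
```

Posterior probabilities of network features (e.g. the presence of an edge) can then be estimated as Monte Carlo averages over the sampled structures.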